
Reviews: Cross-channel Communication Networks

Neural Information Processing Systems

The authors propose an approach to increase the representational power of neural networks by introducing communication between neurons in the same layer. To this end, a neural communication block is introduced. It first encodes the feature map of each neuron to reduce its dimensionality by a factor of 8. An attention-based GCN then propagates information between the neurons over a fully connected graph: for each node, a weighted sum of the neuron encodings is computed, with the weights determined by the similarity of the nodes' features. Finally, the updated representation is decoded back to the original resolution and added to the original features. Importantly, the model applies the same operations to every neuron, so the number of parameters is independent of the number of channels, though it does depend on the spatial size of the feature map.
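The encode–attend–decode pipeline described above can be sketched in a few lines of numpy. This is a hypothetical illustration of the mechanism, not the authors' implementation: the weight matrices `W_enc`/`W_dec` stand in for learned encoder/decoder layers, and channels are treated as nodes of a fully connected graph whose edge weights come from a softmax over encoding similarities.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def c3_block(x, rng=None):
    """Sketch of a cross-channel communication block.

    x: (C, H*W) array, one flattened feature map per channel (neuron).
    Weights are random stand-ins for the learned encoder/decoder.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    C, n = x.shape
    d = max(n // 8, 1)                      # encoder reduces dimensionality ~8x
    W_enc = rng.standard_normal((n, d)) / np.sqrt(n)
    W_dec = rng.standard_normal((d, n)) / np.sqrt(d)

    E = x @ W_enc                           # (C, d) per-channel encodings
    A = softmax(E @ E.T / np.sqrt(d))       # attention weights from similarity
    M = A @ E                               # weighted sum over all channels
    return x + M @ W_dec                    # decode and add to original features

x = np.random.default_rng(1).standard_normal((16, 64))  # 16 channels, 8x8 maps
y = c3_block(x)
print(y.shape)
```

Note that the parameter count here (`W_enc`, `W_dec`) depends only on the spatial size `n = H*W` and the bottleneck width `d`, not on the number of channels `C`, matching the review's observation.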


Reviews: Cross-channel Communication Networks

Neural Information Processing Systems

This paper introduces a novel approach for inter-neuron communication within the same layer of a standard neural network: a Neural Communication (NC) block. The NC block yields slightly better performance (an absolute improvement of about 1%) than previous similar methods (such as Squeeze-and-Excitation blocks) on image classification, semantic segmentation, and object detection, and reviewers find the formulation more general and better motivated. The authors demonstrate that shallower networks with the NC block can achieve performance similar to that of deeper networks. They also provide a detailed and useful analysis of the properties of the learned representations, showing that the NC block leads to neurons that are less correlated. Reviewers are concerned that the improvement is only marginal and that a comparison to non-local networks is not reported.



Cross-channel Communication Networks

Yang, Jianwei, Ren, Zhile, Gan, Chuang, Zhu, Hongyuan, Parikh, Devi

Neural Information Processing Systems

Convolutional neural networks process input data by sending channel-wise feature response maps to subsequent layers. While a lot of progress has been made by making networks deeper, information from each channel can only be propagated from lower levels to higher levels in a hierarchical feed-forward manner. When viewing each filter in the convolutional layer as a neuron, those neurons are not communicating explicitly within each layer in CNNs. We introduce a novel network unit called Cross-channel Communication (C3) block, a simple yet effective module to encourage the neuron communication within the same layer. The C3 block enables neurons to exchange information through a micro neural network, which consists of a feature encoder, a message communicator, and a feature decoder, before sending the information to the next layer.